
    The effect of emotion intensity on time perception: a study with transcranial random noise stimulation

    Emotional facial expressions provide cues for social interactions, and emotional events can distort our sense of time. The present study investigates the effect of facial emotional stimuli of anger and sadness on time perception. Moreover, to investigate the causal role of the orbitofrontal cortex (OFC) in emotion recognition, we employed transcranial random noise stimulation (tRNS) over the OFC and tested its effect on participants' emotion recognition as well as on time processing. Participants performed a timing task in which they were asked to categorize as "short" or "long" temporal intervals marked by images of people expressing angry, sad or neutral facial expressions. In addition, they were asked to judge whether the image presented showed a person expressing anger or sadness. The visual stimuli were emotional faces expressing anger or sadness at high (80%), medium (60%) and low (40%) intensity, along with neutral faces. In the emotion recognition task, results showed that participants were faster and more accurate when emotional intensity was higher. Moreover, tRNS over the OFC interfered with emotion recognition, in line with its proposed role in emotion recognition. In the timing task, participants overestimated the duration of angry facial expressions, although neither emotional intensity nor OFC stimulation significantly modulated this effect. Conversely, as emotional intensity increased, participants exhibited a greater tendency to overestimate the duration of sad faces in the sham condition; this tendency disappeared with tRNS. Taken together, our results are partially consistent with previous findings of an overestimation effect for emotionally arousing stimuli and point to an involvement of the OFC in emotional distortions of time, which needs further investigation.
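
    As a rough illustration of how such a temporal categorization task can be analyzed, the sketch below fits a logistic psychometric function to the proportion of "long" responses and reads off the bisection point (point of subjective equality); a lower bisection point for one facial expression than for another would indicate that its duration is overestimated. The probe durations, response proportions and function choice are illustrative assumptions, not the study's actual data or pipeline.

        # Hypothetical sketch: estimating temporal overestimation from "short"/"long"
        # judgements by fitting a logistic psychometric function per condition.
        import numpy as np
        from scipy.optimize import curve_fit

        def logistic(t, pse, slope):
            # Proportion of "long" responses as a function of interval duration t (ms).
            return 1.0 / (1.0 + np.exp(-(t - pse) / slope))

        durations = np.array([400, 600, 800, 1000, 1200, 1400])  # assumed probe intervals (ms)

        def fit_pse(p_long):
            # p_long: proportion of "long" responses at each duration for one condition.
            params, _ = curve_fit(logistic, durations, p_long, p0=[900.0, 100.0])
            return params[0]  # bisection point (point of subjective equality)

        # Made-up proportions: a lower PSE for angry than for neutral faces would mean
        # that intervals marked by angry faces are judged "long" more often.
        p_long_neutral = np.array([0.05, 0.20, 0.45, 0.70, 0.90, 0.97])
        p_long_angry = np.array([0.10, 0.30, 0.60, 0.80, 0.95, 0.99])
        print(fit_pse(p_long_neutral), fit_pse(p_long_angry))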

    Exploring manual asymmetries during grasping: a dynamic causal modeling approach

    Recordings of neural activity during grasping actions in macaques showed that grasp-related sensorimotor transformations are accomplished in a circuit constituted by the anterior part of the intraparietal sulcus (AIP) and the ventral (F5) and dorsal (F2) regions of the premotor area. In humans, neuroimaging studies have revealed the existence of a similar circuit, involving the putative homologs of macaque areas AIP, F5 and F2. These studies have mainly considered grasping movements performed with the right dominant hand, and only a few studies have measured brain activity associated with movements performed with the left non-dominant hand. As a consequence of this gap, how the brain controls grasping movements performed with the dominant and the non-dominant hand remains an open question. A functional magnetic resonance imaging (fMRI) experiment was conducted, and effective connectivity analysis (Dynamic Causal Modelling, DCM) was used to assess how connectivity among grasping-related areas is modulated by hand (i.e., left or right) during the execution of grasping movements towards a small object requiring a precision grip. Results underlined boosted inter-hemispheric coupling between the dorsal premotor cortices during the execution of movements performed with the left rather than the right dominant hand. More specifically, they suggest that the dorsal premotor cortices may play a fundamental role in monitoring the configuration of the fingers when grasping movements are performed by either the right or the left hand. This role becomes particularly evident when the hand less skilled in performing such actions (i.e., the left hand) is used. The results are discussed in light of recent theories put forward to explain how parieto-frontal connectivity is modulated by the execution of prehensile movements.
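
    DCM parameters are typically estimated with dedicated neuroimaging software; the sketch below only illustrates, on made-up numbers, how per-subject estimates of an inter-hemispheric premotor coupling modulation could be compared between left-hand and right-hand execution once they have been obtained. The values and the choice of a paired t-test are illustrative assumptions, not the study's analysis.

        # Hypothetical sketch: comparing assumed per-subject DCM coupling modulations
        # of an inter-hemispheric dorsal premotor connection across the two hands.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        n_subjects = 16
        coupling_left_hand = rng.normal(0.35, 0.10, n_subjects)   # assumed modulation values
        coupling_right_hand = rng.normal(0.20, 0.10, n_subjects)

        t, p = stats.ttest_rel(coupling_left_hand, coupling_right_hand)
        print(f"paired t = {t:.2f}, p = {p:.3f}")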

    Intentional Binding effect in children: insights from a new paradigm

    Intentional binding (IB) refers to the temporal attraction between a voluntary action and its sensory consequence. Since its discovery in 2002, it has been considered a valid implicit measure of the sense of agency (SoA), since it only occurs in the context of voluntary actions. The vast majority of studies on IB have recruited young adults as participants, while neglecting possible age-related differences. The aim of the present work is to study the development of IB in 10-year-old children. In place of Libet's classical clock method, we implemented a new and more suitable paradigm to study IB, since children could have difficulties with reading clocks. A stream of unpredictable letters was therefore used: participants had to remember which letter was on the screen when they made a voluntary action, heard a sound, or felt their right index finger being moved down passively. In Experiment I, a group of young adults was tested in order to replicate the IB effect with this new paradigm. In Experiment II, the same paradigm was administered to children in order to investigate whether the effect has already emerged at this age. The data from Experiment I showed the presence of the IB effect in adults, whereas Experiment II demonstrated a clear reduction of IB. The comparison of the two groups revealed that the young adults differed from the children, showing a significantly stronger linkage between actions and their consequences. The results indicate a developmental trend in the IB effect. This finding is discussed in light of the maturation process of the frontal cortical network.
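
    A minimal sketch of how binding can be quantified with such a letter-stream paradigm is given below, assuming each reported letter has already been converted into a time stamp: binding is the shift in judgement error (reported minus actual event time) from the single-event baseline blocks to the action-plus-tone blocks. All numbers and the summary measure are illustrative assumptions, not the study's data.

        # Hypothetical sketch: intentional binding as the shift in judgement error
        # between baseline and operant (action + tone) blocks.
        import numpy as np

        # Mean judgement errors (ms) per participant; positive = reported later than actual.
        baseline_action = np.array([-20.0, -35.0, -10.0, -25.0])
        operant_action = np.array([40.0, 25.0, 55.0, 30.0])
        baseline_tone = np.array([30.0, 15.0, 25.0, 20.0])
        operant_tone = np.array([-40.0, -60.0, -30.0, -45.0])

        action_binding = operant_action - baseline_action  # action perceived closer to the tone
        tone_binding = operant_tone - baseline_tone        # tone perceived closer to the action
        overall_binding = action_binding - tone_binding    # one possible summary measure

        print(action_binding.mean(), tone_binding.mean(), overall_binding.mean())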

    Potential for social involvement modulates activity within the mirror and the mentalizing systems

    Processing biological motion is fundamental for everyday life activities, such as social interaction, motor learning and nonverbal communication. The ability to detect the nature of a motor pattern has been investigated by means of point-light displays (PLD), sets of moving light points reproducing human kinematics that are easily recognizable as meaningful once in motion. Although PLD are rudimentary, the human brain can decipher their content, including social intentions. Neuroimaging studies suggest that inferring the social meaning conveyed by PLD could rely on both the Mirror Neuron System (MNS) and the Mentalizing System (MS), but their specific contributions to this process remain uncertain. We describe a functional magnetic resonance imaging experiment in which participants had to judge whether visually presented PLD and videoclips of human-like walkers (HL) were facing towards or away from them. Results show that coding for stimulus direction specifically engages the MNS when considering PLD moving away from the observer, while the nature of the stimulus reveals a dissociation between the MNS, mainly involved in coding for PLD, and the MS, recruited by HL moving away. These results suggest that the contribution of the two systems can be modulated by the nature of the observed stimulus and its potential for social involvement.

    Decoding social intentions in human prehensile actions: Insights from a combined kinematics-fMRI study

    Consistent evidence suggests that the way we reach and grasp an object is modulated not only by object properties (e.g., size, shape, texture, fragility and weight), but also by the type of intention driving the action, among which is the intention to interact with another agent (i.e., social intention). Action observation studies ascribe the neural substrate of this 'intentional' component to the putative mirror neuron system (pMNS) and the mentalizing system (MS). How social intentions are translated into executed actions, however, has yet to be addressed. We conducted a kinematic and a functional Magnetic Resonance Imaging (fMRI) study considering a reach-to-grasp movement performed towards the same object positioned at the same location but with different intentions: passing it to another person (social condition) or putting it on a concave base (individual condition). Kinematics showed that individual and social intentions are characterized by different profiles, with a slower movement at the level of both the reaching (i.e., arm movement) and the grasping (i.e., hand aperture) components. fMRI results showed that: (i) distinct voxel pattern activity for the social and the individual condition is present within the pMNS and the MS during action execution; (ii) decoding accuracies of regions belonging to the pMNS and the MS are correlated, suggesting that these two systems could interact in the generation of appropriate motor commands. Results are discussed in terms of motor simulation and inferential processes as part of a hierarchical generative model for action intention understanding and the generation of appropriate motor commands.
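
    The sketch below illustrates the general shape of such an ROI-based decoding analysis, assuming trial-wise voxel patterns are already available: a cross-validated linear classifier separates social from individual trials within each region, and the per-subject accuracies of two regions are then correlated across subjects. The region names, data shapes, random data and classifier are assumptions; the study's actual pipeline may differ.

        # Hypothetical sketch: ROI decoding of social vs individual trials followed by
        # an across-subject correlation of decoding accuracies between two regions.
        import numpy as np
        from scipy import stats
        from sklearn.svm import SVC
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(1)

        def roi_accuracy(n_trials=40, n_voxels=100):
            X = rng.normal(size=(n_trials, n_voxels))  # trial-wise voxel patterns (random here)
            y = np.repeat([0, 1], n_trials // 2)       # 0 = individual, 1 = social
            return cross_val_score(SVC(kernel="linear"), X, y, cv=5).mean()

        n_subjects = 12
        acc_pmns = np.array([roi_accuracy() for _ in range(n_subjects)])
        acc_ms = np.array([roi_accuracy() for _ in range(n_subjects)])

        r, p = stats.pearsonr(acc_pmns, acc_ms)  # correlated accuracies across subjects
        print(f"r = {r:.2f}, p = {p:.3f}")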

    Cortical Activations in Humans Grasp-Related Areas Depend on Hand Used and Handedness

    Background: In non-human primates, grasp-related sensorimotor transformations are accomplished in a circuit involving the anterior intraparietal sulcus (area AIP) and both the ventral and the dorsal sectors of the premotor cortex (vPMC and dPMC, respectively). Although a human homologue of such a circuit has been identified, whether activity within this circuit varies depending on handedness has yet to be investigated. Methodology/Principal Findings: We used functional magnetic resonance imaging (fMRI) to explicitly test how handedness modulates activity within human grasping-related brain areas. Right- and left-handed subjects were requested to reach towards and grasp an object with either the right or the left hand using a precision grip while being scanned. A kinematic study with similar procedures was conducted as a behavioral counterpart to the fMRI experiment. Results from a factorial design revealed significant activity within the right dPMC, the right cerebellum and AIP bilaterally. The pattern of activity within these areas mirrored the results of the behavioral study. Conclusions/Significance: The data are discussed in terms of a handedness-independent role for the right dPMC in monitoring hand shaping, the need for bilateral AIP activity in the performance of precision grip movements, which varies depending on handedness, and the involvement of the cerebellum in terms of its connections with AIP. These results provide the first compelling evidence of grasping-related neural activity that depends specifically on handedness.
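
    As a simplified illustration of the factorial logic (hand used x handedness) applied to the behavioral counterpart, the sketch below runs a two-way analysis of variance on a made-up kinematic measure (time to maximum grip aperture). It deliberately ignores the repeated-measures structure for brevity; a faithful analysis would model the within-subject factor, and none of the values reflect the study's data.

        # Hypothetical sketch: simplified 2x2 ANOVA (handedness x hand used) on a
        # made-up kinematic measure, treating all observations as independent.
        import numpy as np
        import pandas as pd
        from statsmodels.formula.api import ols
        from statsmodels.stats.anova import anova_lm

        rng = np.random.default_rng(2)
        rows = []
        for handedness in ["right_hander", "left_hander"]:
            for hand in ["right_hand", "left_hand"]:
                for _ in range(10):
                    rows.append({"handedness": handedness, "hand": hand,
                                 "tmga_ms": rng.normal(450, 40)})
        df = pd.DataFrame(rows)

        model = ols("tmga_ms ~ C(handedness) * C(hand)", data=df).fit()
        print(anova_lm(model, typ=2))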

    Structure of the motor descending pathways correlates with the temporal kinematics of hand movements

    Simple Summary: How hand motor behavior relates to the microstructure of the underlying subcortical white matter pathways is yet to be fully understood. Here we consider two well-known examples of our everyday motor repertoire, reaching and reach-to-grasp movements, by looking at their temporal unfolding and at the microstructure of the descending projection pathways conveying motor information from the motor cortices towards the more ventral regions of the nervous system. We combine three-dimensional kinematics, describing the temporal profile of hand movements, with diffusion imaging tractography, exploring the microstructure of specific segments of the projection pathways (internal capsule, corticospinal and hand motor tracts). The results indicate that the level of anisotropy characterizing these white matter tracts can influence the temporal unfolding of reaching and reach-to-grasp movements.

    Abstract: The projection system, a complex organization of ascending and descending white matter pathways, is the principal system for conveying sensory and motor information, connecting frontal and sensorimotor regions with ventral regions of the central nervous system. The corticospinal tract (CST), one of the principal projection pathways, carries distal movement-related information from the cortex to the spinal cord, and whether its microstructure is linked to the kinematics of hand movements is still an open question. The aim of the present study was to explore how the microstructure of descending branches of the projection system, namely the hand motor tract (HMT), the corticospinal tract (CST) and its sector within the internal capsule (IC), relates to the temporal profile of reaching and reach-to-grasp movements. The projection pathways of 31 healthy subjects were virtually dissected by means of diffusion tractography, and the kinematics of their reaching and reach-to-grasp movements were analyzed. A positive association between Hindrance Modulated Orientation Anisotropy (HMOA) and kinematics was observed, suggesting that the anisotropy of the considered tracts can influence the temporal unfolding of motor performance. We highlight, for the first time, that hand kinematics and the visuomotor transformation processes underlying reaching and reach-to-grasp movements relate to the microstructure of the specific projection fibers subserving these movements.
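
    The core of such an association can be illustrated, on assumed values, as a simple across-subject correlation between a tract's anisotropy (e.g., HMOA of the corticospinal tract) and a temporal kinematic measure such as movement time; tractography and kinematic preprocessing are taken as already done, and all numbers below are made up.

        # Hypothetical sketch: correlating assumed per-subject tract anisotropy with
        # an assumed temporal kinematic measure across 31 subjects.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(3)
        n_subjects = 31
        hmoa_cst = rng.normal(0.12, 0.015, n_subjects)  # assumed HMOA values for the CST
        movement_time = 900 - 2000 * (hmoa_cst - 0.12) + rng.normal(0, 30, n_subjects)  # ms

        r, p = stats.pearsonr(hmoa_cst, movement_time)
        print(f"r = {r:.2f}, p = {p:.3f}")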

    When Ears Drive Hands: The Influence of Contact Sound on Reaching to Grasp

    Background: Most research on the roles of auditory information and its interaction with vision has focused on perceptual performance. Little is known about the effects of sound cues on visually guided hand movements. Methodology/Principal Findings: We recorded the sound produced by the fingers upon contact as participants grasped stimulus objects covered with different materials. In a further session, the pre-recorded contact sounds were delivered to participants via headphones before or after the initiation of reach-to-grasp movements towards the stimulus objects. Reach-to-grasp movement kinematics were measured under the following conditions: (i) congruent, in which the presented contact sound and the contact sound elicited by the to-be-grasped stimulus corresponded; (ii) incongruent, in which the presented contact sound was different from that generated by the stimulus upon contact; (iii) control, in which a synthetic sound, not associated with a real event, was presented. Facilitation effects were found for congruent trials; interference effects were found for incongruent trials. In a second experiment, the upper and the lower parts of the stimulus were covered with different materials, and the presented sound was always congruent with the material covering either the upper or the lower half of the stimulus. Participants consistently placed their fingers on the half of the stimulus that corresponded to the presented contact sound. Conclusions/Significance: Altogether these findings offer a substantial contribution to the current debate about the type of object representations elicited by auditory stimuli and about the multisensory nature of the sensorimotor transformations underlying action.
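
    A simplified sketch of how the facilitation and interference effects could be tested on a kinematic measure (e.g., movement time) across the three sound conditions is given below. The one-way test and all values are illustrative assumptions and do not reflect the study's within-subject analysis.

        # Hypothetical sketch: comparing a made-up kinematic measure across the
        # congruent, incongruent and control sound conditions.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(4)
        congruent = rng.normal(780, 40, 20)    # ms; facilitation expected (faster)
        incongruent = rng.normal(850, 40, 20)  # ms; interference expected (slower)
        control = rng.normal(815, 40, 20)      # ms; synthetic-sound baseline

        f, p = stats.f_oneway(congruent, incongruent, control)
        print(f"F = {f:.2f}, p = {p:.3f}")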